
Conversation


@jsonbailey jsonbailey commented Oct 31, 2025

feat: Add support for tracking streaming text metrics with getAIMetricsFromStream
feat: Renamed createAIMetrics to getAIMetricsFromResponse (previous method is marked as deprecated)
fix!: VercelProvider now requires type-safe parameters for Vercel models
fix: Properly convert LD model parameters to Vercel model parameters
fix: Prefer totalUsage over usage when mapping to LDTokenUsage
fix: Check finishReason for an error when determining model success
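
As a rough illustration of the renamed and new metrics helpers listed above, a minimal sketch follows. The placement of these helpers (assumed here to be static methods on `VercelProvider`) and their exact signatures are assumptions based on this description, not verified API:

```ts
// Sketch only: getAIMetricsFromResponse / getAIMetricsFromStream are assumed to be
// static helpers on VercelProvider, per this PR description; real signatures may differ.
import { generateText, streamText } from 'ai';
import { openai } from '@ai-sdk/openai';
import { VercelProvider } from '@launchdarkly/server-sdk-ai-vercel';

async function collectMetrics() {
  // Non-streaming: derive LD AI metrics from a completed generateText response.
  const response = await generateText({
    model: openai('gpt-4o-mini'),
    prompt: 'Summarize the release notes.',
  });
  const responseMetrics = VercelProvider.getAIMetricsFromResponse(response);

  // Streaming: consume the stream, then derive metrics once it has finished.
  // (Depending on the AI SDK version, streamText may or may not need to be awaited.)
  const stream = streamText({
    model: openai('gpt-4o-mini'),
    prompt: 'Summarize the release notes.',
  });
  for await (const chunk of stream.textStream) {
    process.stdout.write(chunk);
  }
  const streamMetrics = await VercelProvider.getAIMetricsFromStream(stream);

  return { responseMetrics, streamMetrics };
}
```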


> [!NOTE]
> Adds toVercelAISDK, enforces typed Vercel model parameters with proper LD→Vercel mapping, and enhances usage/streaming metrics handling.
>
> - **VercelProvider**:
>   - Add `toVercelAISDK()` to build a Vercel AI SDK config (merges messages, maps params).
>   - Enforce type-safe model parameters via `VercelAIModelParameters`; update the constructor and `create()` to use `mapParameters()`.
>   - Implement `mapParameters()` to convert LD params (e.g., `max_tokens`, `stop`) to Vercel fields (`maxTokens`, `stopSequences`).
>   - Improve metrics: add `mapUsageDataToLDTokenUsage()`, support `totalUsage`/`usage`, incorporate `finishReason`, and add `createStreamMetrics()` for streaming.
> - **Types/Exports**:
>   - Introduce `types.ts` with `VercelAIModelParameters`, `VercelAISDKConfig`, `VercelAISDKMapOptions`, and `VercelAISDKProvider`, and export them from `index.ts`.

Written by Cursor Bugbot for commit 47a0875. This will update automatically on new commits.
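
To make the parameter mapping concrete, here is a minimal sketch of the LD→Vercel field conversion described in the note above. The interfaces and field names follow this summary (`max_tokens` → `maxTokens`, `stop` → `stopSequences`), while the `toVercelAISDK()` call shape shown in the trailing comments is an assumption rather than confirmed API:

```ts
// Sketch of the LD → Vercel parameter mapping described in this PR.
// The real mapParameters() implementation in VercelProvider may differ.
interface LDModelParameters {
  max_tokens?: number;
  temperature?: number;
  stop?: string[];
}

interface VercelAIModelParameters {
  maxTokens?: number;
  temperature?: number;
  stopSequences?: string[];
}

function mapParameters(params: LDModelParameters): VercelAIModelParameters {
  return {
    maxTokens: params.max_tokens,
    temperature: params.temperature,
    stopSequences: params.stop,
  };
}

// Hypothetical usage of toVercelAISDK(): build a config whose merged messages and
// mapped parameters can be spread into generateText/streamText. Exact signature assumed.
// const sdkConfig = VercelProvider.toVercelAISDK(aiConfig, { provider: openai });
// const result = await generateText({ ...sdkConfig });
```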

@jsonbailey jsonbailey requested a review from a team as a code owner October 31, 2025 02:09
fix!: VercelProvider now requires type-safe parameters for Vercel models
fix: Properly convert LD model parameters to Vercel model parameters
@jsonbailey jsonbailey force-pushed the jb/sdk-1532/support-tovercelaisdk-methods-in-provider branch from b8e6700 to 9393389 on October 31, 2025 02:15
@jsonbailey jsonbailey changed the base branch from jb/sdk-1532/deprecate-tovercelaisdk to main October 31, 2025 02:15
@github-actions

@launchdarkly/browser size report
This is the brotli compressed size of the ESM build.
Compressed size: 169118 bytes
Compressed size limit: 200000 bytes
Uncompressed size: 789399 bytes

@github-actions

@launchdarkly/js-client-sdk size report
This is the brotli compressed size of the ESM build.
Compressed size: 21721 bytes
Compressed size limit: 25000 bytes
Uncompressed size: 74698 bytes

@github-actions

@launchdarkly/js-sdk-common size report
This is the brotli compressed size of the ESM build.
Compressed size: 24988 bytes
Compressed size limit: 26000 bytes
Uncompressed size: 122411 bytes

@github-actions

@launchdarkly/js-client-sdk-common size report
This is the brotli compressed size of the ESM build.
Compressed size: 17636 bytes
Compressed size limit: 20000 bytes
Uncompressed size: 90259 bytes

@jsonbailey jsonbailey merged commit 28d3650 into main Nov 4, 2025
33 checks passed
@jsonbailey jsonbailey deleted the jb/sdk-1532/support-tovercelaisdk-methods-in-provider branch November 4, 2025 19:08
@github-actions github-actions bot mentioned this pull request Nov 4, 2025
jsonbailey added a commit that referenced this pull request Nov 5, 2025
🤖 I have created a release *beep* *boop*
---


<details><summary>server-sdk-ai: 0.13.0</summary>

## [0.13.0](server-sdk-ai-v0.12.3...server-sdk-ai-v0.13.0) (2025-11-04)


### Features

* Add support for trackStreamMetricsOf method
([#971](#971))
([e18979e](e18979e))


### Bug Fixes

* Deprecated toVercelAISDK, trackVercelAISDKStreamTextMetrics, use
`@launchdarkly/server-sdk-ai-vercel` package
([e18979e](e18979e))
</details>
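
For context on the `trackStreamMetricsOf` entry above, a hedged sketch of how it might wrap a streaming call. The method name comes from the changelog, but the tracker type and argument shape shown here are assumptions:

```ts
// Assumed shape: trackStreamMetricsOf wraps a stream-producing call and records
// metrics on the LD AI config tracker once the stream completes. Not confirmed API.
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';
import { VercelProvider } from '@launchdarkly/server-sdk-ai-vercel';

async function streamWithTracking(tracker: any, prompt: string) {
  const result = tracker.trackStreamMetricsOf(
    () => streamText({ model: openai('gpt-4o-mini'), prompt }),
    // Hypothetical extractor: derive LD metrics from the finished stream.
    (stream: any) => VercelProvider.getAIMetricsFromStream(stream),
  );

  for await (const chunk of result.textStream) {
    process.stdout.write(chunk);
  }
}
```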

<details><summary>server-sdk-ai-langchain: 0.2.0</summary>

## [0.2.0](server-sdk-ai-langchain-v0.1.3...server-sdk-ai-langchain-v0.2.0) (2025-11-04)


### Features

* Renamed createAIMetrics to getAIMetricsFromResponse
([#977](#977))
([05b4667](05b4667))


### Dependencies

* The following workspace dependencies were updated
  * devDependencies
    * @launchdarkly/server-sdk-ai bumped from ^0.12.3 to ^0.13.0
</details>

<details><summary>server-sdk-ai-openai: 0.2.0</summary>

## [0.2.0](server-sdk-ai-openai-v0.1.2...server-sdk-ai-openai-v0.2.0) (2025-11-04)


### Features

* Renamed createAIMetrics to getAIMetricsFromResponse
([#977](#977))
([05b4667](05b4667))


### Dependencies

* The following workspace dependencies were updated
  * devDependencies
    * @launchdarkly/server-sdk-ai bumped from ^0.12.3 to ^0.13.0
</details>

<details><summary>server-sdk-ai-vercel: 0.2.0</summary>

## [0.2.0](server-sdk-ai-vercel-v0.1.2...server-sdk-ai-vercel-v0.2.0) (2025-11-04)


### ⚠ BREAKING CHANGES

* VercelProvider now requires type-safe parameters for Vercel models

### Features

* Add support for tracking streaming text metrics with getAIMetricsFromStream
([28d3650](28d3650))
* Add toVercelAISDK method to support easy model creation
([#972](#972))
([28d3650](28d3650))
* Renamed createAIMetrics to getAIMetricsFromResponse
([#977](#977))
([05b4667](05b4667))


### Bug Fixes

* Check finishReason for an error when determining model success
([28d3650](28d3650))
* Prefer totalUsage over usage when mapping to LDTokenUsage
([28d3650](28d3650))
* Properly convert LD model parameters to Vercel model parameters
([28d3650](28d3650))
* VercelProvider now requires type-safe parameters for Vercel models
([28d3650](28d3650))


### Dependencies

* The following workspace dependencies were updated
  * devDependencies
    * @launchdarkly/server-sdk-ai bumped from ^0.12.3 to ^0.13.0
</details>

---
This PR was generated with [Release
Please](https://github.com/googleapis/release-please). See
[documentation](https://github.com/googleapis/release-please#release-please).

<!-- CURSOR_SUMMARY -->
---

> [!NOTE]
> Release bumps: server AI SDK to 0.13.0 with stream metrics; LangChain/OpenAI/Vercel providers to 0.2.0 including metric API rename and Vercel type-safe params plus fixes.
> 
> - **AI SDK (`packages/sdk/server-ai`) — 0.13.0**
>   - Feature: add `trackStreamMetricsOf`.
>   - Fix: deprecate `toVercelAISDK` and related helpers (moved to Vercel provider).
> - **AI Providers — 0.2.0**
>   - `server-ai-langchain`/`server-ai-openai`:
>     - Rename `createAIMetrics` to `getAIMetricsFromResponse`.
>   - `server-ai-vercel`:
>     - Breaking: require type-safe params for Vercel models.
>     - Features: streaming text metrics tracking; `toVercelAISDK` helper.
>     - Fixes: check `finishReason` for errors; prefer `totalUsage`; correct LD→Vercel param mapping.
> - **Examples/Manifest**
>   - Update versions to `@launchdarkly/[email protected]` and providers `@0.2.0` in examples and `.release-please-manifest.json`.
> 
> <sup>Written by [Cursor Bugbot](https://cursor.com/dashboard?tab=bugbot) for commit e2b5498. This will update automatically on new commits. Configure [here](https://cursor.com/dashboard?tab=bugbot).</sup>
<!-- /CURSOR_SUMMARY -->

---------

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: jsonbailey <[email protected]>